Training MLPs layer-by-layer with the information potential

Authors

  • Dongxin Xu
  • José Carlos Príncipe
Abstract

In the area of information processing, one fundamental issue is how to measure the statistical relationship between two variables based only on their samples. In a previous paper, the idea of the Information Potential, formulated from the so-called Quadratic Mutual Information, was introduced and successfully applied to problems such as Blind Source Separation and Pose Estimation of SAR (Synthetic Aperture Radar) Images. This paper shows how the information potential can be used to train an MLP (multilayer perceptron) layer by layer, which provides evidence that the hidden layer of an MLP serves as an "information filter" that tries to best represent the desired output at that layer in the statistical sense of mutual information.
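As a rough illustration of the central quantity: in information-theoretic learning, the information potential of a sample set is commonly estimated with a Parzen window of Gaussian kernels, and its negative logarithm gives Renyi's quadratic entropy. The sketch below is not the authors' implementation, just a minimal estimator assuming Gaussian kernels of width `sigma`:

```python
import numpy as np

def information_potential(x, sigma=1.0):
    """Parzen-window estimate of the information potential
    V(X) = (1/N^2) * sum_i sum_j G(x_i - x_j; 2*sigma^2),
    where G is a Gaussian kernel. -log(V) is Renyi's quadratic entropy."""
    x = np.asarray(x, dtype=float).reshape(len(x), -1)
    n, d = x.shape
    # pairwise squared distances between all samples
    sq = np.sum((x[:, None, :] - x[None, :, :]) ** 2, axis=-1)
    two_var = 2.0 * sigma ** 2  # kernel variances add under convolution
    g = np.exp(-sq / (2.0 * two_var)) / ((2.0 * np.pi * two_var) ** (d / 2))
    return g.mean()

def quadratic_entropy(x, sigma=1.0):
    """Renyi's quadratic entropy estimate from samples."""
    return -np.log(information_potential(x, sigma))
```

Tightly clustered samples yield a large potential (hence low entropy), widely spread samples a small one; layer-by-layer training then maximizes a mutual-information criterion built from such potentials between the layer output and the desired signal.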


Similar Articles

Training Multi-layer Perceptrons Using MiniMin Approach

Multi-layer perceptrons (MLPs) have been widely used in classification and regression tasks. How to improve the training speed of MLPs has been an interesting field of research. Instead of the classical method, we try to train MLPs by a MiniMin model which can ensure that the weights of the last layer are optimal at each step. Significant improvement in training speed has been made using our met...
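The "optimal last layer at each step" idea can be illustrated in isolation: with linear output units and a squared-error loss, the output weights for fixed hidden activations are given exactly by least squares. This is only a hedged sketch of that single step, not the MiniMin method itself (the paper's full procedure is truncated above); the function name and bias handling are assumptions:

```python
import numpy as np

def optimal_output_weights(hidden, targets):
    """Least-squares-optimal linear output weights for fixed hidden
    activations (assumed linear output units, squared-error loss)."""
    # append a bias column to the hidden activations
    h = np.hstack([hidden, np.ones((hidden.shape[0], 1))])
    w, *_ = np.linalg.lstsq(h, targets, rcond=None)
    return w
```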


Modeling of measurement error in refractive index determination of fuel cell using neural network and genetic algorithm

Abstract: In this paper, a method for determination of the refractive index in the membrane of a fuel cell, on the basis of a three-longitudinal-mode laser heterodyne interferometer, is presented. The optical path difference between the target and reference paths is fixed, and the phase shift is then calculated in terms of the refractive index shift. The measurement accuracy of this system is limited by nonlinearity erro...


Multi-layer Perceptrons with Embedded Feature Selection with Application in Cancer Classification

∗ This research was supported by the National Natural Science Foundation of China under Grant No. 60372050. Abstract — This paper proposes a novel neural network model, named multi-layer perceptrons with embedded feature selection (MLPs-EFS), where feature selection is incorporated into the training procedure. Compared with classical MLPs, MLPs-EFS adds a preprocessing step where each featur...


Comparison and Combination of Multilayer Perceptrons and Deep Belief Networks in Hybrid Automatic Speech Recognition Systems

To improve speech recognition performance, many ways to augment or combine HMMs (Hidden Markov Models) with other models to build hybrid architectures have been proposed. The hybrid HMM/ANN (Hidden Markov Model / Artificial Neural Network) architecture is one of the most successful approaches. In this hybrid model, ANNs (which are often multilayer perceptrons, MLPs) are used a...


Sparse one hidden layer MLPs

We discuss how to build a sparse one-hidden-layer MLP, replacing the standard l2 weight decay penalty on all weights by an l1 penalty on the linear output weights. We propose an iterative two-step training procedure where the output weights are found using the FISTA proximal optimization algorithm to solve a Lasso-like problem, and the hidden weights are computed by unconstrained minimization. As ...
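The Lasso-like step mentioned above can be sketched generically: FISTA alternates a gradient step on the smooth squared-error term with the soft-thresholding proximal operator of the l1 penalty, plus a momentum extrapolation. This is a minimal standalone FISTA for a linear Lasso problem, not the paper's full two-step MLP procedure; the variable names and fixed iteration count are assumptions:

```python
import numpy as np

def soft_threshold(w, t):
    """Proximal operator of t * ||w||_1 (elementwise soft thresholding)."""
    return np.sign(w) * np.maximum(np.abs(w) - t, 0.0)

def fista_lasso(A, b, lam, n_iter=200):
    """Minimize 0.5 * ||A w - b||^2 + lam * ||w||_1 with FISTA."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the gradient
    w = np.zeros(A.shape[1])
    y, t = w.copy(), 1.0
    for _ in range(n_iter):
        grad = A.T @ (A @ y - b)       # gradient of the smooth part at y
        w_new = soft_threshold(y - grad / L, lam / L)
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = w_new + ((t - 1.0) / t_new) * (w_new - w)  # momentum step
        w, t = w_new, t_new
    return w
```

The l1 proximal step drives small output weights exactly to zero, which is what makes the resulting one-hidden-layer MLP sparse.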



Journal:

Volume   Issue

Pages  -

Publication date: 1999